Editorial Bias in Crowd-Sourced Political Information
Authors
Abstract
The Internet has dramatically expanded citizens' access to and ability to engage with political information. On many websites, any user can contribute and edit "crowd-sourced" information about important political figures. One of the most prominent examples of crowd-sourced information on the Internet is Wikipedia, a free and open encyclopedia created and edited entirely by users, and one of the world's most accessed websites. While previous studies of crowd-sourced information platforms have found them to be accurate, few have considered biases in what kinds of information are included. We report the results of four randomized field experiments that sought to explore what biases exist in the political articles of this collaborative website. By randomly assigning factually true but either positive or negative and cited or uncited information to the Wikipedia pages of U.S. senators, we uncover substantial evidence of an editorial bias toward positivity on Wikipedia: Negative facts are 36% more likely to be removed by Wikipedia editors than positive facts within 12 hours and 29% more likely within 3 days. Although citations substantially increase an edit's survival time, the editorial bias toward positivity is not eliminated by inclusion of a citation. We replicate this study on the Wikipedia pages of deceased as well as recently retired but living senators and find no evidence of an editorial bias in either. Our results demonstrate that crowd-sourced information is subject to an editorial bias that favors the politically active.
Similar sources
InterPoll: Crowd-Sourced Internet Polls
Crowd-sourcing is increasingly being used to provide answers to online polls and surveys. However, existing systems, while taking care of the mechanics of attracting crowd workers, poll building, and payment, provide little to help the survey-maker or pollster in obtaining statistically significant results devoid of even the obvious selection biases. This paper proposes InterPoll, a platform fo...
The Crowd vs. the Lab: A Comparison of Crowd-Sourced and University Laboratory Participant Behavior
There are considerable differences in remuneration and environment between crowd-sourced workers and the traditional laboratory study participant. If crowd-sourced participants are to be used for information retrieval user studies, we need to know if and to what extent their behavior on information retrieval tasks differs from the accepted standard of laboratory participants. With both crowd-so...
InterPoll: Crowd-Sourced Internet Polls (Done Right), MSR-TR-2014-3, Benjamin Livshits and Todd Mytkowicz
Crowd-sourcing is increasingly being used for providing answers to online polls and surveys. However, existing systems, while taking care of the mechanics of attracting crowd workers, poll building, and payment, provide little that would help the survey-maker or pollster to obtain statistically significant results devoid of even the obvious selection biases. This paper proposes InterPoll, a pla...
Crowd-sourced data coding for the social sciences: massive non-expert human coding of political texts
A large part of empirical social science relies heavily on data that are not observed in the field, but are generated by researchers sitting at their desks. Clearly, third party users of such coded data must satisfy themselves in relation to both reliability and validity. This paper discusses some of these matters for a widely used type of coded data, derived from content analysis of political ...
Crowd-sourced data coding for the social sciences: massive non-expert coding of political texts
A large part of empirical social science relies heavily on data that are not observed in the field, but are generated by researchers sitting at their desks, raising obvious issues of both reliability and validity. This paper addresses these issues for a widely used type of coded data, derived from the content analysis of political text. Comparing estimates derived from multiple “expert” and cro...
Journal title:
Volume 10, Issue -
Pages: -
Publication date: 2015